Neural Machine Translation by Fusing Key Information of Text

Authors

Abstract

When the Transformer was proposed by Google in 2017, it was first applied to machine translation tasks and achieved state-of-the-art results at that time. Although current neural models can generate high-quality translations, mistranslations and omissions of key information still occur in long sentences. On the other hand, the most important part of traditional translation is the key information: as long as it is translated accurately and completely, even if other parts of the results are incorrect, the quality of the final results can still be guaranteed. In order to effectively reduce mistranslations and missed key information, and to improve the accuracy and completeness of sentence translation, this paper proposes a key-information-fused translation model based on the Transformer. The model extracts keywords from the source-language text and feeds them into the encoder separately. After being encoded in the same way as the source text, the keyword encoding is fused with the encoder output of the source text and then passed to the decoder. By incorporating keyword information from the sentence, the model becomes much more reliable on the task of translating long sentences. To verify the effectiveness of the key-information fusion method proposed in this paper, a series of experiments were carried out on the verification set. The experimental results show that the Bilingual Evaluation Understudy (BLEU) score of the proposed model on the Workshop on Machine Translation (WMT) 2017 test dataset is higher than the BLEU score of the baseline Transformer on the same dataset, which demonstrates the advantages of the method proposed in this paper.
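As a rough illustration of the fusion described in the abstract, the sketch below assumes a standard Transformer encoder-decoder in which the extracted keywords pass through the same encoder as the source sentence, and the keyword encoding is fused with the source encoding (here simply by concatenation along the sequence axis) before reaching the decoder. All class and parameter names, the dimensions, and the concatenation-based fusion are illustrative assumptions, not details taken from the paper; positional encoding and masking are omitted for brevity.

```python
import torch
import torch.nn as nn

class KeywordFusedTransformer(nn.Module):
    """Sketch of a Transformer that fuses keyword encodings with the
    source-sentence encoding before decoding (illustrative only, not
    the paper's exact architecture)."""

    def __init__(self, vocab_size, d_model=512, nhead=8, num_layers=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        # The same encoder is shared by the source sentence and its keywords.
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, keyword_ids, tgt_ids):
        # Encode the full source sentence and the extracted keywords separately.
        src_mem = self.encoder(self.embed(src_ids))      # (B, S, d_model)
        key_mem = self.encoder(self.embed(keyword_ids))  # (B, K, d_model)
        # Fuse: concatenate along the sequence axis so the decoder can
        # attend to both the sentence and its key information.
        memory = torch.cat([src_mem, key_mem], dim=1)    # (B, S+K, d_model)
        hidden = self.decoder(self.embed(tgt_ids), memory)
        return self.out(hidden)                          # (B, T, vocab_size)
```

In practice the keyword-extraction step and the fusion operator (concatenation, gating, or an extra cross-attention over the keyword memory) are the interesting design choices; the snippet only shows where the keyword encoding enters the pipeline.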


Similar articles

Information-Propogation-Enhanced Neural Machine Translation by Relation Model

Even though sequence-to-sequence neural machine translation (NMT) models have achieved state-of-the-art performance in recent years, it is a widespread concern that recurrent neural network (RNN) units find it very hard to capture long-distance state information, which means an RNN can hardly learn features with long-term dependencies as the sequence becomes longer. Similarly, convolution...


Neural Machine Translation Advised by Statistical Machine Translation

Neural Machine Translation (NMT) is a new approach to machine translation that has made great progress in recent years. However, recent studies show that NMT generally produces fluent but inadequate translations (Tu et al. 2016; He et al. 2016). This is in contrast to conventional Statistical Machine Translation (SMT), which usually yields adequate but non-fluent translations. It is natural, th...


Neural Machine Translation with Latent Semantic of Image and Text

Although attention-based Neural Machine Translation has achieved great success, the attention mechanism cannot capture the entire meaning of the source sentence, because it generates a target word depending heavily on the relevant parts of the source sentence. Earlier studies have introduced a latent variable to capture the entire meaning of the sentence and achieved i...


Key-value Attention Mechanism for Neural Machine Translation

In this paper, we propose a neural machine translation (NMT) model with a key-value attention mechanism on the source-side encoder. The key-value attention mechanism separates the source-side content vector into two types of memory known as the key and the value. The key is used for calculating the attention distribution, and the value is used for encoding the context representation. Experiments on t...
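As a hedged sketch of the key/value separation described in this abstract: the source-side encoder states are projected into a key memory, used only to compute the attention distribution, and a value memory, used only to build the context representation. The class name, projection layers, and dimensions below are illustrative assumptions, not the authors' implementation.

```python
import math
import torch
import torch.nn as nn

class KeyValueSourceAttention(nn.Module):
    """Sketch of key-value attention over source encoder states:
    keys score the alignment, values carry the context (illustrative)."""

    def __init__(self, d_model=512):
        super().__init__()
        self.key_proj = nn.Linear(d_model, d_model)    # memory for scoring
        self.value_proj = nn.Linear(d_model, d_model)  # memory for context
        self.query_proj = nn.Linear(d_model, d_model)

    def forward(self, decoder_state, encoder_states):
        # decoder_state: (B, d_model); encoder_states: (B, S, d_model)
        q = self.query_proj(decoder_state).unsqueeze(1)         # (B, 1, d)
        k = self.key_proj(encoder_states)                       # (B, S, d)
        v = self.value_proj(encoder_states)                     # (B, S, d)
        scores = q @ k.transpose(1, 2) / math.sqrt(k.size(-1))  # (B, 1, S)
        attn = scores.softmax(dim=-1)   # attention distribution from the keys
        context = attn @ v              # context representation from the values
        return context.squeeze(1), attn.squeeze(1)
```

Separating the scoring memory from the content memory lets the model use different features for deciding where to attend and for what to pass on, which is the distinction the abstract describes.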


Neural Name Translation Improves Neural Machine Translation

In order to control computational complexity, neural machine translation (NMT) systems convert all rare words outside the vocabulary into a single unk symbol. A previous solution (Luong et al., 2015) resorts to using multiple numbered unks to learn the correspondence between source and target rare words. However, words unseen in the training corpus cannot be handled by this method at test time. And it a...



Journal

Journal title: Computers, Materials & Continua

Year: 2023

ISSN: 1546-2218, 1546-2226

DOI: https://doi.org/10.32604/cmc.2023.032732